Is Bayes posterior just quick and dirty confidence?
Author: D. A. S. Fraser
Abstract
Bayes (1763) introduced the observed likelihood function to statistical inference and provided a weight function to calibrate the parameter. He also introduced a confidence distribution on the parameter space, but restricted attention to models now called location models; of course, the names likelihood and confidence did not appear until much later: Fisher (1922) for likelihood and Neyman (1937) for confidence. Lindley (1958) showed that the Bayes and the confidence results were different when the model was not a location model. This paper examines the occurrence of true statements from the Bayes approach and from the confidence approach, and shows that the proportion of true statements in the Bayes case depends critically on the presence of linearity in the model; with departure from this linearity the Bayes approach can be seriously misleading. Bayesian integration of a weighted likelihood provides a first-order linear approximation to confidence, but without linearity it can give substantially incorrect results.
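As a minimal numerical illustration of the "proportion of true statements" comparison (a sketch of ours, not an example taken from the paper), the following Python snippet simulates upper bounds for the rate θ of an exponential model with a single observation y. The exact confidence bound follows from the pivot θy ~ Exponential(1); the flat-prior Bayes bound is the 95% quantile of the Gamma(2, y) posterior, and the root-information (Jeffreys) prior 1/θ gives a Gamma(1, y) posterior. The true rate θ0 = 2, the repetition count, and the 95% level are arbitrary choices.

import numpy as np
from scipy import stats

# Model: one observation y ~ Exponential(rate = theta).
# Exact confidence upper bound: from the pivot theta * y ~ Exponential(1).
# Flat-prior ("quick and dirty") Bayes bound: posterior is Gamma(shape=2, rate=y).
# Root-information (Jeffreys) prior 1/theta: posterior is Gamma(shape=1, rate=y).
rng = np.random.default_rng(0)
theta0, n_rep, level = 2.0, 200_000, 0.95            # hypothetical true rate, repetitions, nominal level
y = rng.exponential(scale=1.0 / theta0, size=n_rep)  # one observation per repetition

conf_bound = -np.log(1.0 - level) / y                         # exact 95% upper confidence bound
flat_bound = stats.gamma.ppf(level, a=2, scale=1.0 / y)       # flat-prior posterior 95% quantile
jeffreys_bound = stats.gamma.ppf(level, a=1, scale=1.0 / y)   # Jeffreys-prior posterior 95% quantile

for name, bound in [("confidence", conf_bound),
                    ("flat-prior Bayes", flat_bound),
                    ("Jeffreys-prior Bayes", jeffreys_bound)]:
    # proportion of repetitions in which the statement "theta0 <= bound" is true
    print(f"{name:22s} proportion of true statements: {np.mean(theta0 <= bound):.3f}")

In this scale model a log transformation restores location structure, so the Jeffreys-prior bound reproduces confidence exactly and covers the true rate at the nominal 95%, while the flat-prior bound covers it roughly 99% of the time; the stated posterior probability and the realized proportion of true statements then disagree.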
Similar articles
Is Bayes Posterior just Quick and Dirty Confidence?
Bayes [Philos. Trans. R. Soc. Lond. 53 (1763) 370–418; 54 296–325] introduced the observed likelihood function to statistical inference and provided a weight function to calibrate the parameter; he also introduced a confidence distribution on the parameter space but did not provide present justifications. Of course the names likelihood and confidence did not appear until much later: Fisher [Phi...
Invariant Empirical Bayes Confidence Interval for Mean Vector of Normal Distribution and its Generalization for Exponential Family
Based on a given Bayesian model of multivariate normal with known variance matrix we will find an empirical Bayes confidence interval for the mean vector components which have normal distribution. We will find this empirical Bayes confidence interval as a conditional form on ancillary statistic. In both cases (i.e. conditional and unconditional empirical Bayes confidence interval), the empiri...
Exponential Models: Approximations for Probabilities
Welch & Peers (1963) used a root-information prior to obtain posterior probabilities for a scalar parameter exponential model and showed that these Bayes probabilities had the confidence property to second order asymptotically. An important undercurrent of this indicates that the constant information reparameterization provides location model structure, for which the confidence property ...
Ensemble Confidence Estimates Posterior Probability
We have previously introduced the Learn++ algorithm that provides surprisingly promising performance for incremental learning as well as data fusion applications. In this contribution we show that the algorithm can also be used to estimate the posterior probability, or the confidence of its decision, on each test instance. On three increasingly difficult tests that are specifically designed to com...